Weight initialization methods for multilayer feedforward
Authors
Abstract
In this paper, we present the results of an experimental comparison of seven weight initialization methods across twelve problems. The comparison measures the speed of convergence, the generalization capability, and the probability of successful convergence; evaluations of all three properties together are rare in the literature on weight initialization. The training algorithm was Backpropagation (BP) with a hyperbolic tangent transfer function. We found that performance can be improved with respect to the usual initialization scheme.
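The "usual initialization scheme" the abstract compares against typically means drawing weights uniformly from a small symmetric interval. A minimal sketch of that baseline with the tanh transfer function mentioned above (the interval [-0.5, 0.5], layer sizes, and network depth are illustrative assumptions, not taken from the paper):

```python
import numpy as np

rng = np.random.default_rng(0)

def init_uniform(n_in, n_out, r=0.5):
    # Usual scheme: each weight drawn uniformly from [-r, r].
    return rng.uniform(-r, r, size=(n_in, n_out))

def forward(x, W1, W2):
    # One hidden layer, hyperbolic tangent transfer function,
    # as in the BP setup described in the abstract.
    h = np.tanh(x @ W1)
    return np.tanh(h @ W2)

W1 = init_uniform(4, 8)
W2 = init_uniform(8, 1)
y = forward(np.ones((1, 4)), W1, W2)
```

Initialization methods in such comparisons then differ mainly in how the interval (or variance) is chosen, e.g. as a function of the fan-in of each unit.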
Similar resources
IDIAP Technical report
Proper initialization is one of the most important prerequisites for fast convergence of feed-forward neural networks like high order and multilayer perceptrons. This publication aims at determining the optimal value of the initial weight variance (or range), which is the principal parameter of random weight initialization methods for both types of neural networks. An overview of random weight...
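The "initial weight variance (or range)" this report treats as the principal parameter can be realized by scaling a zero-mean random draw. A minimal sketch (the Gaussian distribution and the layer sizes are illustrative assumptions; the report itself studies which variance value is optimal):

```python
import numpy as np

rng = np.random.default_rng(1)

def init_with_variance(n_in, n_out, var):
    # Zero-mean Gaussian weights with the requested variance.
    # The equivalent uniform range would be r = sqrt(3 * var),
    # since Var(Uniform[-r, r]) = r**2 / 3.
    return rng.normal(0.0, np.sqrt(var), size=(n_in, n_out))

W = init_with_variance(100, 50, var=0.01)
empirical_var = float(W.var())
```

With the variance as the single knob, a comparison like the one above reduces to sweeping `var` and measuring convergence speed at each value.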
Mapping Some Functions and Four Arithmetic Operations to Multilayer Feedforward Neural Networks
This paper continues the development of a heuristic initialization methodology for designing multilayer feedforward neural networks aimed at modeling nonlinear functions for engineering mechanics applications as presented previously at IMAC XXIV and XXV. Seeking a transparent and domain knowledge-based approach for neural network initialization and result interpretation, this study examines the...
Dynamic tunneling technique for efficient training of multilayer perceptrons
A new efficient computational technique for training multilayer feedforward neural networks is proposed. The proposed algorithm consists of two learning phases. The first phase is a local search which implements gradient descent, and the second phase is a direct search scheme which implements dynamic tunneling in weight space, avoiding local traps and thereby generating the point of the next descent. ...
A New Weight Initialization Method Using Cauchy's Inequality Based on Sensitivity Analysis
In this paper, an efficient weight initialization method is proposed using Cauchy’s inequality based on sensitivity analysis to improve the convergence speed in single hidden layer feedforward neural networks. The proposed method ensures that the outputs of hidden neurons are in the active region which increases the rate of convergence. Also the weights are learned by minimizing the sum of squa...
The Method of Steepest Descent for Feedforward Artificial Neural Networks
In this paper, we implement the method of steepest descent in single and multilayer feedforward artificial neural networks. In all previous works, the weight update equations for single or multilayer feedforward artificial neural networks have been calculated by choosing a single activation function for the various processing units in the network. We, at first, calculate the total error function ...
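Steepest descent updates each weight against the gradient of the total error function. A minimal sketch on a single linear unit with squared error (the learning rate, data, and iteration count are illustrative assumptions, not taken from the paper):

```python
import numpy as np

def steepest_descent_step(w, X, y, lr=0.1):
    # Squared error E = 0.5 * ||X @ w - y||**2;
    # steepest descent: w <- w - lr * dE/dw.
    grad = X.T @ (X @ w - y)
    return w - lr * grad

# Tiny illustrative problem with a known solution w* = y.
X = np.array([[1.0, 0.0], [0.0, 1.0]])
y = np.array([2.0, -1.0])
w = np.zeros(2)
for _ in range(50):
    w = steepest_descent_step(w, X, y)
```

In a multilayer network the same update applies per layer, with the gradient obtained by backpropagating the error through each unit's activation function.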
Journal title:
Volume, issue:
Pages: -
Publication date: 2001